perm filename UNIFY.ABS[P,JRA]1 blob
sn#148471 filedate 1975-03-03 generic text, type C, neo UTF8
difficulties in programming
how do you get a correct program? consider unify; one cannot simply go to
a published algorithm, for the version in Luckham's paper has a bug. Yet
unify is basically a computationally defined algorithm (not like
factorial, say). So how do you do it? Verification of a completely
constructed program, regardless of the language involved, is a loss. You
must get the certifying process into the construction process. However,
this is only part of the problem. The primary concern must be the proper
construction of programs at the intuitive level... the proper
education of programmers, if you wish. For it will be many years
before our formal tools for certification approach our ability to
describe algorithms. Compare mathematics. Informal but convincing
tools for mathematical reasoning are the rule; formal proofs are
the exception. If practicing mathematicians were required to publish
formal proofs, a significant percentage of research would not be
going on. Note that incorrect "proofs" are published in mathematics.
We really cannot expect much better results in programming. The task
of programming is not easy, and the training and perspective of the
ordinary programmer do not approach those of the mathematically
trained. What can we expect realistically? Compare mathematics
again. What is the role of undergraduate education in mathematics?
It is not technical competence; it is the cultivation of ways of
thinking... "mathematical sophistication" if you wish. Computer
Science curricula must become more aware of the need for a similar
degree of "sophistication" in the area of programming.
"Methodology" is the wrong word to use in the context of programming.
There are no "rules or postulates" for program construction, any more
than there are rules for finding mathematical proofs. There are
simple syntactic rules, just as there are formal deductive rules for
parts of formal mathematics which characterize logically valid
deductions. But these rules say nothing about proof discovery, any
more than we can expect rules for program construction. The
motivation for the discussion of structured programming (note
"programming", not "programs") was the attempt to develop a
"sophistication" for programming. The current debasing of the
terminology stems from a serious misunderstanding... attempting to turn
style into dogma.
So proper construction of programs stems from two ideas: (1) a proper
education in programming "style", and (2) a programming environment
which supports and reinforces this style. Examining one without the
other will not work. The best approach, it seems, is to describe a
style of programming which will be conducive to clarity and
correctness and concurrently point out those desirable system
features which reinforce this process.
The programming style is a direct derivative of McCarthy's ideas on
abstraction. If structuring is to mean anything, abstraction is the
key idea. ...more...
The example we will take is Robinson's unification routine. Emphasis
will be on the construction of simple programs, operating on heavily
abstracted data; as we become convinced of the correctness of each
level of program, we proceed to an elaboration of one of the
abstractions (cf. Dijkstra). We pick non-numerical examples since
it is in these domains that the full power of the method becomes
apparent.
We will attempt to go directly from the description given in Luckham
to a machinable version.
unify[l] <= unify1[l;()]
        ;so far we assume nothing about the structure
        of "l"; we assume that the second argument of
        unify1 -a substitution- is a sequence, though
        the definition only presupposes that it is a
        set. (N.B. obviously there are restrictions on
        "l", just as substitutions are not arbitrary
        sequences, but at this level of elaboration
        the substructures are irrelevant.)
unify1[l;s] <= {
 lab: [equality[l;s] → return[s]];
      x ← lo[das[apply_sub_lits[s;l]]];
      [var[first[x]] ∧ ¬ocr[first[x];second[x]]
           → s ← new_subs[first[x];second[x];s]; go[lab];
       T → return[FALSE]] }
        ;now where are we? What new information have
        we added? The use of "first" and "second" implies
        a sequence character to the effect of the
        function "lo". We should be able to give a
        convincing argument that unify1 realizes the
        intended scheme of steps (2)-(3) in Luckham.
        This encoding should also illuminate the
        "coding error" in that paper. What we need to do
        now is supply the missing routines,
        elaborating the data structures, while the
        system is checking for consistency of types,
        assumptions of data structuring, ... and
        general low-level bookkeeping.
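The loop above can be run as written once the abstractions are pinned down. A minimal Python transliteration, under representation assumptions the original deliberately leaves open (variables as strings beginning with '?', applications and literals as tuples (head, arg1, ...), the substitution as a dict kept idempotent); "das" and the variable-first ordering of "lo" are folded together here:

```python
def is_var(t):
    # representation assumption: variables are strings beginning with '?'
    return isinstance(t, str) and t.startswith('?')

def apply_sub(s, t):
    # apply substitution s (an idempotent dict) throughout term/literal t
    if is_var(t):
        return s.get(t, t)
    if isinstance(t, tuple):
        return (t[0],) + tuple(apply_sub(s, a) for a in t[1:])
    return t

def occurs(v, t):
    # ocr: does variable v occur anywhere in t?
    return v == t or (isinstance(t, tuple) and any(occurs(v, a) for a in t[1:]))

def das(a, b):
    # disagreement set: the leftmost pair of differing subterms of a and b
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and a[0] == b[0] and len(a) == len(b)):
        for x, y in zip(a[1:], b[1:]):
            if x != y:
                return das(x, y)
    return (a, b)

def unify_lits(l1, l2):
    # the lab-loop of unify1: extend s one binding at a time
    s = {}
    while True:
        a, b = apply_sub(s, l1), apply_sub(s, l2)
        if a == b:
            return s                      # equality[l;s] holds
        x, y = das(a, b)
        if is_var(y):                     # "lo": put a variable first
            x, y = y, x
        if is_var(x) and not occurs(x, y):
            s = {u: apply_sub({x: y}, w) for u, w in s.items()}
            s[x] = y                      # new_subs, kept idempotent
        else:
            return False                  # T → return[FALSE]
```

Note the occurs check guarding new_subs: omitting it is exactly the kind of coding error that lets unification loop or produce cyclic bindings.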
;go after "equality[l;s]";
l: set of literals -- assume sequence representation
s: substitution -- represented as sequence
equality[l;s] <=
[null[l] → T;
T → equality1[apply_sub[s;first[l]];rest[l];s]
]
equality1[target;l;s] <=
[null[l] → T;
target = apply_sub[s;first[l]] → equality1[target;rest[l];s];
T → NIL
]
apply_sub[s;lit] <= mklit[predlet[lit];apply_sub_terms[s;terms[lit]]]
apply_sub_terms[s;tms] <=
[null[tms] → ();      ;assuming sequence representation now
T → concat[apply_sub_term[s;first[tms]];
apply_sub_terms[s;rest[tms]] ]
]
apply_sub_term[s;term] <=
[isvar[term] → lookup[term;s];
isconst[term] → term;
is_fun_term[term] → mkterm[fun_let[term];
apply_sub_terms[s;args[term]]]
]
lookup[var;s] <=
[null[s] → var;
var = name[first[s]] → val[first[s]];
T → lookup[var; rest[s]]
]
;elaboration of equality is finished
;back to top level
; next, begin elaboration of "lo[das[apply_sub_lits[s;l]]]"
apply_sub_lits[s;l] <=
	[null[l] → ();
	 T → concat[apply_sub[s;first[l]];
	            apply_sub_lits[s;rest[l]]]
	]
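The equality elaboration transcribes almost line for line. A hedged Python sketch, under assumed representations consistent with the text (variables as '?'-strings, a literal as a (pred, t1, ..., tn) tuple, a substitution as a sequence of (name, val) pairs):

```python
def is_var(term):
    # assumption: variables are strings beginning with '?'
    return isinstance(term, str) and term.startswith('?')

def lookup(var, s):
    # s is a sequence of (name, val) pairs, searched left to right
    if not s:
        return var
    if var == s[0][0]:
        return s[0][1]
    return lookup(var, s[1:])

def apply_sub_term(s, term):
    if is_var(term):
        return lookup(term, s)
    if isinstance(term, tuple):                  # function application
        return (term[0],) + apply_sub_terms(s, term[1:])
    return term                                  # constant

def apply_sub_terms(s, tms):
    # sequence representation of term lists
    if not tms:
        return ()
    return (apply_sub_term(s, tms[0]),) + apply_sub_terms(s, tms[1:])

def apply_sub(s, lit):
    # mklit: rebuild the literal from its predicate letter and new terms
    return (lit[0],) + apply_sub_terms(s, lit[1:])

def equality(l, s):
    # do all literals in l collapse to a single literal under s?
    if not l:
        return True
    return equality1(apply_sub(s, l[0]), l[1:], s)

def equality1(target, l, s):
    if not l:
        return True
    if apply_sub(s, l[0]) == target:
        return equality1(target, l[1:], s)
    return False
```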
das[l] <=
[
Try a more intuitive approach
Literals l1 and l2 are unifiable if there exists a substitution, s,
such that the application of s to l1 yields a literal equal to the
application of s to l2.
is_unifiable[l1;l2]
terms are either variables, constants or function applications.
is_term[t]
a substitution, s, is a set of ordered pairs <vi, ti>, each pairing a
variable vi with a term ti, such that the vi are distinct and no vi = ti.
issub[s; v1; v2; ...vn; t1; ... tn]
application of a substitution s (issub[s; v1; ... vn; t1; ... tn])
to a term, t: a simultaneous replacement
of any vi's which occur in t by their corresponding ti's
is_application[t_new;s;t]
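The well-formedness condition on substitutions can itself be coded. A small Python sketch (the name issub and the conditions are from the text; the '?'-string convention for variables and the pair-sequence representation are assumptions):

```python
def is_var(t):
    # assumption: variables are strings beginning with '?'
    return isinstance(t, str) and t.startswith('?')

def issub(pairs):
    # pairs is a sequence of (vi, ti): each vi a distinct variable, no vi = ti
    vs = [v for v, _ in pairs]
    return (all(is_var(v) for v in vs)
            and len(set(vs)) == len(vs)
            and all(v != t for v, t in pairs))
```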
unify[l1;l2] <= unify1[terms[l1];terms[l2];()]
        ;; if is_unifiable[l1;l2] is true, then
        the unifying substitution is the result of unify:
        apply_sub_lit[unify[l1;l2];l1] = apply_sub_lit[unify[l1;l2];l2]
unify1[t1;t2;s] <=
	[empty[t1] → s;
	 T → λ[[x] [null[x] → NIL;
	            T → unify1[rest[t1];rest[t2];compose[x;s]]]]
	     [unify2[apply_sub_term[s;first[t1]];apply_sub_term[s;first[t2]]]]
	]
        ;; on exit: apply_sub_terms[s;t1] = apply_sub_terms[s;t2], or
        NIL is returned and the term lists are not unifiable.
unify2[x;y] <=
	[is_var[x] → [is_var[y] → [x = y → (); T → mk_subs[x;y]];
	              is_fun_app[y] → [ocr[x;y] → NIL;
	                               T → mk_subs[x;y]]];
	 is_fun_app[x] → [is_var[y] → [ocr[y;x] → NIL;
	                               T → mk_subs[y;x]];
	                  is_fun_app[y] → [fun[x] ≠ fun[y] → NIL;
	                                   T → unify1[args[x];args[y];()]]]
	]
apply_sub[s;lit] <= mklit[predlet[lit];apply_sub_terms[s;terms[lit]]]
apply_sub_terms[s;tms] <=
[null[tms] → ();      ;assuming sequence representation now
T → concat[apply_sub_term[s;first[tms]];
apply_sub_terms[s;rest[tms]] ]
]
apply_sub_term[s;term] <=
[isvar[term] → lookup[term;s];
is_const[term] → term;
is_fun_term[term] → mkterm[fun_let[term];
apply_sub_terms[s;args[term]]]
]
lookup[var;s] <=
[null[s] → var;
var = name[first[s]] → val[first[s]];
T → lookup[var;rest[s]]
]
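Assembled, the structure-directed version runs. A Python sketch of the whole elaboration (the function names transliterate the routines above; the tuple/'?'-string representation and the use of a dict for substitutions are assumptions, with None standing in for NIL and {} for the empty substitution):

```python
def is_var(t):
    # assumption: variables are strings beginning with '?'
    return isinstance(t, str) and t.startswith('?')

def is_fun_app(t):
    return isinstance(t, tuple)

def apply_sub_term(s, t):
    if is_var(t):
        return s.get(t, t)
    if is_fun_app(t):
        return (t[0],) + tuple(apply_sub_term(s, a) for a in t[1:])
    return t                                     # constant

def apply_sub_lit(s, lit):
    return (lit[0],) + tuple(apply_sub_term(s, t) for t in lit[1:])

def occurs(v, t):
    # ocr: does variable v occur anywhere in t?
    return v == t or (is_fun_app(t) and any(occurs(v, a) for a in t[1:]))

def compose(x, s):
    # apply the new bindings x to s's values, then add x itself
    out = {v: apply_sub_term(x, t) for v, t in s.items()}
    out.update(x)
    return out

def unify2(x, y):
    # unify two already-substituted terms; None = NIL, {} = empty substitution
    if is_var(x):
        if x == y:
            return {}
        if occurs(x, y):
            return None
        return {x: y}
    if is_var(y):
        return unify2(y, x)
    if is_fun_app(x) and is_fun_app(y):
        if x[0] != y[0] or len(x) != len(y):
            return None
        return unify1(x[1:], y[1:], {})
    return {} if x == y else None                # constants

def unify1(t1, t2, s):
    if not t1:
        return s
    x = unify2(apply_sub_term(s, t1[0]), apply_sub_term(s, t2[0]))
    if x is None:
        return None
    return unify1(t1[1:], t2[1:], compose(x, s))

def unify(l1, l2):
    # top level: literals unify only when the predicate letters agree
    if l1[0] != l2[0] or len(l1) != len(l2):
        return None
    return unify1(l1[1:], l2[1:], {})
```

On exit, the property stated for unify holds: applying the resulting substitution to l1 and to l2 yields equal literals.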